Method for using a portable electronic device
Patent abstract:
METHOD FOR USING A PORTABLE ELECTRONIC DEVICE A portable electronic device having an input device for receiving gesture-based input from a user is used to control a navigation operation of an apparatus. The handheld electronic device receives gesture-based input via the input device and uses one or more parameters stored in the handheld electronic device's memory, together with one or more characteristics associated with the gesture-based input, to cause the handheld electronic device to transmit a navigation step command and thereby control the navigation operation of the apparatus. Publication number: BR112012004844B1 Application number: R112012004844-9 Filing date: 2010-08-31 Publication date: 2021-02-17 Inventors: Arsham Hatambeiki; Jeffrey Kohanek; Pamela Eichler Keiles; Patrick H. Hayes Applicant: Universal Electronics Inc. IPC main class:
Patent description:
[0001] Control devices, for example, remote controls, for use in issuing commands to entertainment and other appliances, and the features and functionality provided by such control devices, are well known in the art. Traditionally, user input means on such control devices have comprised a series of buttons, each of which can result in the transmission of a specific command when activated. Increasingly in today's environments, such control devices need to be used to interact with displayed menu systems, navigate web pages, manipulate pointers, and perform other similar activities that may require movement, e.g., scrolling through information displayed on a screen, moving a pointer, controlling a game or avatar activity, zooming in/out, controlling functions such as fast forward or slow motion, or the like (such activities collectively referred to hereinafter as "navigation"). Although control of navigation functions is possible using conventional control device input mechanisms, such as a group of up, down, left, and right arrow keys, in many circumstances the user experience can be improved by providing an input mechanism that is better suited to this type of activity. Additionally, in some cases the user experience can be further enhanced by allowing adjustment of various aspects of the navigation input system, such as sensitivity, speed, etc., according to personal preference. SUMMARY OF THE INVENTION [0002] In accordance with these and other needs, the following generally describes a system and method for providing improved navigation input functionality in a control device.
For this purpose, in addition to a conventional key matrix for receiving button inputs as is well known in the art, a control device can be provided with navigation-specific input means such as, for example, a mechanical scroll wheel, a resistive or capacitive touch sensor, etc., whereby movement and/or pressure by a user's finger can be translated into a repetitive series of step commands transmitted to a target controlled device. These commands can be applied at the target device to control navigation-related operations, such as scrolling a menu, moving an on-screen cursor, moving a game object, etc., as appropriate for a particular application. The repetitive step rate output by the control device can vary depending on movement speed and/or pressure intensity, and can also include non-linear response factors such as acceleration/deceleration as well as virtual button sensing. The parameters controlling such response factors can be adjustable per user to match individual preferences. [0003] A better understanding of the objects, advantages, features, properties, and relationships of the invention will be obtained from the following detailed description and attached drawings, which set forth illustrative embodiments and indicate the various ways in which the principles of the invention can be employed.
BRIEF DESCRIPTION OF THE DRAWINGS [0004] For a better understanding of the various aspects of the invention, reference can be made to the preferred embodiments shown in the attached drawings, in which: Figure 1 illustrates an exemplary system in which an exemplary control device according to the present invention can be used; Figure 2 illustrates a block diagram of exemplary components of the exemplary control device of Figure 1; Figure 3 further illustrates the exemplary control device of Figure 1; Figure 4 illustrates exemplary methods for reporting navigation step functions from a control device to a target device; and Figure 5 illustrates the definition of an exemplary virtual button press zone for the navigation input sensing area of an exemplary control device. DETAILED DESCRIPTION [0005] Turning now to Figure 1, an exemplary system is illustrated in which a control device 100 is configured to control various controllable devices, such as a television 102 and a set-top box ("STB") 104. As is known in the art, the control device 100 is capable of transmitting commands to the devices, using any convenient IR, RF, point-to-point, or networked protocol, to make the devices perform operational functions. While illustrated in the context of television 102 and STB 104, it is to be understood that controllable devices may include, but need not be limited to, televisions, VCRs, DVRs, DVD players, cable or satellite converter boxes ("STBs"), amplifiers, CD players, game consoles, home lighting, drapery, fans, HVAC systems, thermostats, personal computers, etc.
In a particular illustrative embodiment, in addition to conventional control functionality as is well known in the art, control device 100 may further include an input area 106 for generating navigation commands for transmission from control device 100 to one or more devices in response to user interaction with that area, used for example to scroll a program guide menu display 108 on TV 102 by issuing a series of step commands to set-top box 104, as will be described in further detail hereinafter. [0006] With reference to Figure 2, for use in commanding the functional operations of one or more devices, the control device 100 may include, as necessary for a particular application, a processor 200 coupled with a ROM memory 204; a RAM memory 202; a key matrix 216 (e.g., hard keys, soft keys such as a touch-sensitive surface superimposed on a liquid crystal display (LCD), and/or an electroluminescent (EL) display); a scrolling and/or navigation function input means 218 such as a capacitive or resistive touch sensor, scroll wheel, etc.; transmission circuit(s) and/or transceiver circuit(s) 210 (e.g., IR and/or RF); a non-volatile read/write memory 206; a means 220 for providing visual feedback to the user (e.g., one or more LEDs, a display, and/or the like); means 222 for providing audible feedback to a user (e.g., a loudspeaker, piezoelectric buzzer, etc.); a power source 208; an input/output port 224 such as a serial interface, USB port, modem, Zigbee, WiFi, or Bluetooth transceiver, etc.; a biometric input device 226 such as a fingerprint recognition pad, hand tremor detector, etc.; and clock and timer logic 212 with an associated crystal or resonator 214.
[0007] As will be understood by those skilled in the art, some or all of memories 202, 204, 206 may include executable instructions (collectively, the program memory) that are intended to be executed by processor 200 to control the operation of remote control 100, as well as data that serves to define the necessary control protocols and command values for use in transmitting command signals to controllable devices (collectively, the command data). In this way, processor 200 can be programmed to control the various electronic components within remote control 100, e.g., to monitor key matrix 216, to cause signal transmission, etc. Non-volatile read/write memory 206, for example an EEPROM, battery-backed RAM, FLASH, Smart Card, memory stick, or the like, can additionally be provided to store configuration data and parameters as needed. While memory 204 is illustrated and described as a ROM memory, memory 204 can also be comprised of any type of readable media, such as ROM, FLASH, EEPROM, or the like. Preferably, memories 204 and 206 are non-volatile or battery-backed such that data is not required to be reloaded after battery changes. In addition, memories 202, 204, and 206 can take the form of a chip, a hard disk, a magnetic disk, an optical disk, and/or the like. Still further, it will be appreciated that some or all of the illustrated memory devices can be physically combined (for example, a single FLASH memory can be logically divided into different portions to support the functionality of memories 204 and 206 respectively), and/or can be physically embedded within the same IC chip as microprocessor 200 (a so-called "microcontroller"); as such, they are shown separately in Figure 2 only for the sake of clarity. [0008] To make the control device 100 perform an action, the control device 100 is adapted to be responsive to events, such as a perceived user interaction with the key matrix 216, etc.
In response to an event, appropriate instructions within the program memory (hereinafter the "operating program") can be executed. For example, when a function key is actuated on the control device 100, the control device 100 can retrieve from the command data stored in memory 202, 204, 206 a command value and control protocol corresponding to the actuated function key and, where necessary, the current device mode, and will use the retrieved command data to transmit to a desired target device, e.g., STB 104, a command in a format recognizable by that device, thereby controlling one or more functional operations of that device. It will be appreciated that the operating program can be used not only to cause the transmission of commands and/or data to the devices, but also to perform local operations. While not limiting, the local operations that can be performed by the control device 100 may include displaying information, favorite channel configuration, macro key configuration, function key relocation, etc. Examples of local operations can be found in U.S. Patents Nos. 5,481,256, 5,959,751, and 6,014,092. [0009] In some embodiments, control device 100 may be of the universal type, that is, provided with a library comprising a multiplicity of command codes and protocols suitable for controlling various devices. In such cases, to select the command data sets to be associated with the specific devices to be controlled (hereinafter referred to as a configuration procedure), data can be entered into the control device 100 that serves to identify each intended target device by its brand, and/or model, and/or type. The data can typically be entered via activation of those keys that are also used to cause the transmission of commands to a device, preferably the keys that are labeled with numerals.
Such data allows the control device 100 to identify the appropriate command data set within the command data library that is to be used to transmit recognizable commands in formats appropriate for such identified devices. The command data library can represent a large number of controllable devices of different types and manufacturers, a large number of controllable devices of the same type but from different manufacturers, a large number of devices from the same manufacturer but of different types or models, etc., or any combination thereof as appropriate for a given embodiment. In conventional practice, as is well known in the art, such data used to identify an appropriate command data set can take the form of a numerical configuration code (obtained, for example, from a printed list of manufacturer names and/or models with corresponding code numbers, from a support website, etc.). Alternative configuration procedures known in the art include scanning bar codes, sequentially transmitting a predetermined command in different formats until a target device response is detected, interaction with a website culminating in the download of command data and/or configuration codes to the control device, etc. Since such methods for configuring a control device to command the operation of specific home appliances are well known, they will not be described in more detail here. However, for additional information pertaining to the configuration procedure, the reader may refer, for example, to U.S. Patents Nos. 4,959,810, 5,614,906, or 6,225,938, all of common assignee and incorporated herein by reference in their entirety. [0010] In keeping with the teachings of this invention, the control device 100 may also include an input device to accept touch input from a user to be translated into navigation commands. In an exemplary embodiment, the input device 218 can take the form of a two-axis, multi-electrode capacitive touch sensor as illustrated in Figures 3 and 5.
In this form, the input device 218 can accept finger-swipe gestures along either axis for translation into navigation step commands in the X or Y direction, as well as taps at the cardinal points and the central area for translation into discrete commands, for example, equivalent to the four arrow keys and a selection key of a conventional keypad, all as will be described in further detail hereinafter. However, it will be appreciated that any other suitable technology, electrical, mechanical, or a combination thereof, can be used to implement the teachings of this invention, and consequently the methods described can be applied independently of the physical input means employed by a particular embodiment. [0011] Navigation step commands resulting from swipe gestures can be reported to a target device using any convenient transmission protocol, IR or RF, as known in the art. In general, such reports may include information representative of both the direction and the speed of the input gesture. By way of example, without limitation, two possible methods of formatting such a reporting transmission are illustrated in Figure 4 for an exemplary gesture 400, made in an upward direction on the y axis of a navigation input area 106 of an exemplary control device 100 and with a speed/time profile as illustrated by curve 402. In one exemplary embodiment, navigation step commands can be reported as illustrated at 404, comprising the transmission of a sequence of navigation direction commands (for example, an "up" command 406) where the navigation direction command is repeated a variable number of times at a variable rate (T1, T2, T3, T4, etc.) which is determined as a function of, for example, the current speed of gesture 400 and one or more of the stored parameters.
In another exemplary embodiment, navigation step commands can be reported as illustrated at 408, comprising the transmission of a sequence of variable direction delta values 410 (for example, "delta Y" = 1, 2, 3, 4, etc.) where the direction delta values are transmitted at fixed intervals Tf and where the delta values are determined as a function of, for example, the current speed of gesture 400 and one or more of the stored parameters. In general, any suitable transmission protocol and reporting method can be used as appropriate for the target device to be controlled. It will also be appreciated that in certain embodiments where control device 100 is of the universal type, more than one method of reporting navigation step commands can be supported by control device 100 and selected depending on the currently configured target device. Since methods for formatting and encoding command transmission packets are well known in the art, for the sake of simplicity they will not be described here. However, for further information regarding formats and encoding schemes, the interested reader may refer, for example, to U.S. Patents 7,167,913 or 5,640,160, both of which are incorporated herein by reference in their entirety. [0012] To adapt the characteristics of the navigation input device 218 used to the requirements of the device to be controlled, in most embodiments a translation is required from the physical input parameters received from the navigation input device 218 to a number of logical steps to be transmitted to the target device, for example STB 104.
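As an illustrative sketch of the two reporting formats just described, the following Python fragment models format 404 (a direction command repeated at a variable rate) and format 408 (direction delta values sent at fixed intervals Tf). The function names, the sampling model, and the numeric constants are assumptions for illustration only; the patent does not prescribe any particular API.

```python
# Sketch of the two navigation step reporting formats. All names and
# constants here are illustrative assumptions, not part of the patent.

def report_repeated(speeds, base_interval_ms=200):
    """Format 404: one direction command per sample, with a repeat
    interval (T1, T2, ...) that shrinks as the gesture speed grows."""
    events = []
    for speed in speeds:  # speed in physical increments per sample
        interval = base_interval_ms / max(speed, 1)
        events.append(("UP", round(interval)))  # (command, ms to next)
    return events

def report_deltas(speeds, phy_max=256, step_max=8):
    """Format 408: one delta value per fixed interval Tf, scaled from
    the raw physical movement measured in that interval."""
    return [("DELTA_Y", int(s * step_max / phy_max)) for s in speeds]
```

For an accelerating gesture sampled as speeds [32, 64, 96], `report_deltas` emits increasing delta values (1, 2, 3), while `report_repeated` emits the same "UP" command at progressively shorter intervals, which is the distinction Figure 4 illustrates.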
This logical step value can be dependent both on the accumulated distance value of an input gesture and on the speed of that gesture. By way of additional example, different navigation input devices may have different physical resolutions; i.e., a complete traversal such as an end-to-end finger swipe, a complete wheel rotation, a maximum swing of electrical impedance, etc., can be translated into a different number of physical steps. This physical resolution needs to be mapped to a number of logical steps that corresponds to a desired range of commands to be issued to a target device. As an example, if a navigation input device such as 218 reports 256 end-to-end increments on its x axis and swipes along the x axis are to be mapped to 8 logical steps, then every 32 steps of physical movement in the illustrative swipe can be translated into one logical step to be reported to the target. In general, for linear, non-accelerated behavior this can be represented as: STEP-D = int(PHY-D * STEP-MAX / PHY-MAX) where • STEP-D: number of logical steps to be reported to the target for a movement PHY-D • PHY-D: raw physical movement delta reported by the input device • PHY-MAX: maximum resolution of the input device • STEP-MAX: number of logical steps to be reported for maximum physical movement. [0013] In certain embodiments, physical movements can be accumulated; i.e., in some sampling periods a movement may not be enough to generate a logical step report to the target, yet over multiple sample periods the cumulative distance may reach the threshold of a logical step and be reported to the target at that moment. Also, in some circumstances, carry-over behavior may be required to ensure a natural translation of physical movements into logical steps.
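The linear, non-accelerated mapping above can be checked with a short sketch using the 256-increment, 8-step example from the text; the function name is an assumption for illustration:

```python
def steps_for_movement(phy_d, phy_max=256, step_max=8):
    """Linear case: STEP-D = int(PHY-D * STEP-MAX / PHY-MAX)."""
    return int(phy_d * step_max / phy_max)
```

Here every 32 physical increments yield one logical step: `steps_for_movement(32)` returns 1 and `steps_for_movement(256)` returns 8, matching the illustrative swipe described above.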
In such cases a PHY-CR parameter, representative of residual physical movement from previous samples not yet applied toward a logical step, can be added as follows: STEP-D = int((PHY-CR + PHY-D) / (PHY-MAX / STEP-MAX)) where PHY-CR = (PHY-CR + PHY-D) mod (PHY-MAX / STEP-MAX) [0014] In addition, in certain embodiments an acceleration coefficient can be applied to the raw data provided by the input device, based on the interaction speed. Such an acceleration coefficient can be derived, for example, as follows: ACC-COEFF = (PHY-D * T-MAX) / (T-D * PHY-MAX) If ACC-COEFF < 1, then ACC-COEFF = 1 (acceleration disabled) where • T-MAX: a reference number of sampling periods for traveling the distance PHY-MAX on the input device, i.e., the highest user input rate that should still result in linear step data output with no acceleration. • T-D: sampling periods elapsed while traveling PHY-D. • ACC-COEFF: the acceleration coefficient to be applied to raw physical data for a non-linear experience. The value of this coefficient is always >= 1.
(When equal to 1, acceleration is disabled.) [0015] This acceleration coefficient can then be used to adjust the PHY-D value in the previous equations, for example: STEP-D = int((PHY-CR + PHY-D * ACC-COEFF) / (PHY-MAX / STEP-MAX)) PHY-CR = (PHY-CR + PHY-D * ACC-COEFF) mod (PHY-MAX / STEP-MAX) [0016] In certain embodiments, a user option can be provided to permanently set the acceleration coefficient to unity, i.e., to always provide a linear output response regardless of the speed of the user's gesture. [0017] Such accelerated browsing behavior may remain in effect only as long as a user's finger is present on the input device. After the finger is lifted from the input device, command output can come to a stop immediately with no further step commands transmitted to the target, or command output can decelerate naturally over time, i.e., simulating a freely spinning "virtual wheel" based on the speed at the point when the finger was lifted from the interface and a deceleration coefficient. In certain embodiments, the type of behavior and/or the deceleration coefficient to be applied may be user-configurable options. By way of example, in one exemplary embodiment, a configurable parameter DEC-COEFF can be set to be applied to the speed of the virtual wheel to slowly decrease it and finally bring it to a natural stop. In one embodiment, the DEC-COEFF range can be 0 to 0.99, where 0 will result in an immediate stop. After the finger is lifted, this deceleration coefficient can be iteratively applied as follows to produce a gradual drop in the PHY-D value over time: PHY-AVE = DEC-COEFF * PHY-AVE PHY-D = PHY-AVE where the initial value of PHY-AVE is the mean physical movement over the last n sample periods before the finger was lifted from the input device, where n >= 1. [0018] The effective number of periods DEC-n to be used can be adjusted as an additional optimization step to ensure a realistic experience.
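The carry-over, acceleration, and free-rotation formulas of paragraphs [0013] through [0018] can be combined into a single translation routine. The Python sketch below does so; the class name, the single-sample averaging window (n = 1), and all default parameter values are illustrative assumptions rather than values taken from the patent.

```python
# Sketch combining PHY-CR carry-over, ACC-COEFF acceleration, and
# DEC-COEFF free-rotation decay. Names and defaults are assumptions.

class StepTranslator:
    def __init__(self, phy_max=256, step_max=8, t_max=10, dec_coeff=0.5):
        self.phy_max, self.step_max = phy_max, step_max
        self.t_max, self.dec_coeff = t_max, dec_coeff
        self.phy_cr = 0.0    # PHY-CR: residual movement carried over
        self.phy_ave = 0.0   # PHY-AVE: recent average movement (n = 1)

    def feed(self, phy_d, t_d):
        """Translate a raw movement PHY-D sampled over T-D periods."""
        acc = (phy_d * self.t_max) / (t_d * self.phy_max)
        if acc < 1:
            acc = 1.0        # below the reference rate: no acceleration
        unit = self.phy_max / self.step_max
        total = self.phy_cr + phy_d * acc
        step_d = int(total / unit)
        self.phy_cr = total % unit
        self.phy_ave = phy_d
        return step_d

    def coast(self):
        """One free-rotation tick after the finger lifts:
        PHY-AVE = DEC-COEFF * PHY-AVE, then translate linearly."""
        self.phy_ave *= self.dec_coeff
        unit = self.phy_max / self.step_max
        total = self.phy_cr + self.phy_ave
        self.phy_cr = total % unit
        return int(total / unit)
```

A slow swipe (64 increments over 10 periods) stays linear and yields 2 steps; the same distance covered in 2 periods triggers ACC-COEFF = 1.25 and carries the fractional remainder forward in PHY-CR, while `coast()` models the decaying virtual wheel after lift-off.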
In addition, in certain embodiments the natural behavior of a physical wheel is taken into account when implementing this free rotation feature. For example, the virtual wheel can be brought to a forced stop as soon as a finger is placed back on the input device, immediately simulating a "DEC-COEFF = 0" state if this event occurs. Other behaviors are also contemplated in various embodiments of this invention; for example, a short tap of the finger on the input device may result in a temporary decrease in the DEC-COEFF deceleration coefficient, thus causing a faster deceleration of the output. [0019] It will be appreciated that in certain embodiments, the scrolling and navigation input device selected for use may already include built-in provisions to support non-linear ballistic or inertial responses such as acceleration, free-rotation decay rate, etc., such as the SO3G2010 sensor module available from Synaptics Inc. of Santa Clara, California, and in such circumstances those features can be used in place of or in conjunction with some or all of the methods described above. Consequently, it is to be understood that the methods described above are presented here by way of illustration only and are not intended to be limiting. [0020] In some embodiments, the navigation input device may also be required to decode finger taps by a user as virtual button presses for the generation of individual commands.
For this purpose, the central crossing point and the far ends of each of the axes can be designated as button areas, in which the type of user interaction determines whether the input is to be interpreted as part of a finger swipe or as a button press. As illustrated in Figure 5, the portion of each axis so designated can be defined by a pair of parameters, HotZoneMin 502 and HotZoneMax 504, applied as described hereinafter. It will be appreciated that in general the areas defined by these parameters are present at all four cardinal points and the central portion of the navigation input device; however, for clarity only one such area is illustrated in Figure 5. To avoid spurious button press reports without introducing uncomfortable delays into the interaction model, an exemplary set of interpretation rules can comprise: 1. In free rotation mode (i.e., while PHY-D processing is decaying according to the current DEC-COEFF), any touch on the interface is interpreted as a forced stop, regardless of location, and evaluated thereafter as the possible start of a swipe action. 2. If not in free rotation mode, any initial contact within a defined HotZoneMin will be evaluated as a candidate for a button press. 3. In order to be interpreted as a button press, one of the following events must occur after the initial contact within a HotZoneMin area: • The finger is not moved outside of the HotZoneMax (which in a preferred embodiment can be slightly larger than HotZoneMin) within an amount of time specified by a VirtualButtonDelay parameter; or • The finger is lifted from the interface without moving out of the HotZoneMax. 4. If the finger is moved out of the HotZoneMax before VirtualButtonDelay has elapsed, the input can be processed as the start of a finger swipe.
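Rules 1 through 4 above can be expressed as a small decision routine. In the Python sketch below, the touch-sample format, the use of a Chebyshev distance for the zones, and the zone and delay values are all illustrative assumptions:

```python
# Sketch of the virtual-button interpretation rules 1-4. The sample
# format (time_ms, x, y, touching) and all constants are assumptions.

HOT_ZONE_MIN = 10           # radius of HotZoneMin around a button point
HOT_ZONE_MAX = 15           # radius of HotZoneMax (slightly larger)
VIRTUAL_BUTTON_DELAY = 300  # ms

def classify_touch(samples, button_xy=(0, 0), free_rotating=False):
    """Return 'stop', 'button', or 'slide' for one touch sequence."""
    if free_rotating:
        return "stop"                  # rule 1: any touch forces a stop
    def dist(x, y):
        return max(abs(x - button_xy[0]), abs(y - button_xy[1]))
    t0, x0, y0, _ = samples[0]
    if dist(x0, y0) > HOT_ZONE_MIN:
        return "slide"                 # rule 2: not a button candidate
    for t, x, y, touching in samples[1:]:
        if dist(x, y) > HOT_ZONE_MAX:
            return "slide"             # rule 4: left HotZoneMax early
        if not touching:
            return "button"            # rule 3: lifted inside the zone
        if t - t0 >= VIRTUAL_BUTTON_DELAY:
            return "button"            # rule 3: stayed put long enough
    return "button"
```

A quick tap inside the zone or a long press that never leaves HotZoneMax classifies as a button press, while any early excursion beyond HotZoneMax becomes the start of a swipe, which mirrors the trade-off the rules are designed to strike.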
[0021] To provide feedback to the user during input, various visible or audible signals can be used, for example flashing the visual feedback device 220 such as an LED, or producing click or buzzer sounds using the audible feedback device 222. In some embodiments, the feedback rate can be proportional to the rate at which step commands are being issued to the target device. Additionally, in some embodiments such feedback can be a user-selectable option. [0022] As will be appreciated, the user-perceived behavior, responsiveness, etc., of such gesture-based navigation input devices may be dependent on multiple parameters such as, for example, with reference to the preceding illustrative embodiments: • Reference traversal time below which acceleration is not triggered (T-MAX) • Deceleration coefficient (DEC-COEFF) • Deceleration measurement window (DEC-n) • Virtual button press area (HotZoneMin, HotZoneMax) • Virtual button press sensitivity (VirtualButtonDelay) [0023] Additionally, certain features can be enabled/disabled according to personal preference, for example acceleration, free rotation, visual/audible feedback, etc. As will be appreciated, for a given embodiment the optimal values of these parameters may vary from user to user. For example, an experienced "power user" may prefer a highly sensitive input system characterized, for example, by a low reference threshold for triggering acceleration and/or a high deceleration coefficient (slow free-rotation decay) along with high button press sensitivity; while a novice user may be more comfortable with little or no acceleration coupled with fast deceleration and low button press sensitivity. [0024] Consequently, in a preferred embodiment some or all of these parameters may be adjustable and stored in the non-volatile memory 206 of the control device 100. In this way, each individual control device may be tailored to match the preferences of the primary user of that device. In order to serve multiple users of a single equipment configuration 102, 104, for example the occupants of a house, multiple control devices 100 can be provisioned such that each can be configured to a particular individual's preferences. In an alternative embodiment, a single control device can be adapted to store multiple user-selectable sets of parameters. In such circumstances, the current user can, for example, identify himself to the control device via activation of one of a set of user buttons 304. Alternatively, in embodiments where control device 100 is equipped with a biometric sensor 226, user identification can be performed automatically. As an example, the biometric sensor device 226 may comprise a fingerprint input pad as contemplated in U.S. Patent 6,906,696 "Method of controlling multi-user access to the functionality of consumer devices", or an accelerometer for recognizing hand tremor characteristics of an individual user as contemplated in U.S. Patent 7,236,156 "Methods and devices for identifying users based on tremor", both of which are incorporated herein by reference in their entirety. [0025] By way of yet another example, an exemplary control device can support user-configurable navigation input parameters of the type and range shown in Table 1 below.
Initially set to the default values shown, each of these parameters can be adjusted by a user to his personal preference, for example by initiating a parameter configuration mode via simultaneously holding down the "enter" key 302 together with a numeric key 306 corresponding to the item number to be adjusted, until an audible beep indicates that parameter setting mode has been initiated. Once in parameter setting mode, a numeric value of between one and four digits (depending on the parameter being set) can be entered for storage in non-volatile memory 206, after which that value is used in all future navigation input decoding and calculations. It will be appreciated that in embodiments that support multiple users, the adjusted values can be stored in a non-volatile memory area assigned to the current user as previously identified, for example, by activating one of the user selection buttons 304. [0026] It will be appreciated that while the exemplary embodiment presented above uses user keyboard input to perform parameter settings, several other methods can be used equally effectively where appropriate. For example, a user can configure the parameters using a PC application, either local or internet-based, and then download the resulting values to the control device using known methods as described, for example, in U.S. Patents 5,953,144 or 5,537,463, both of common assignee and incorporated herein by reference in their entirety. Alternatively, in embodiments where a control device is equipped for two-way IR or RF communication with, for example, set-top box 104, parameter adjustments can be made using an interactive STB application and transferred to the control device via the aforementioned two-way communication link.
Additionally, in both of the above cases, selection of parameter values can be performed interactively via experimentation - for example, a specially designed STB or PC application can allow the user to interact with an exemplary screen display in a calibration mode with the control device parameters all at their default values, and automatically determine the optimal settings for that user based on observed metrics such as reaction time, positioning accuracy, speed of input gestures, etc. [0027] In certain embodiments in which the control device 100 is adapted to support two-way communication with a target device such as STB 104, further refinements to parameter values can be implemented. For example, instead of or as a supplement to the calibration mode mentioned above, a target device can monitor the user's interactions and make dynamic, adaptive adjustments to the parameters stored in the control device via the two-way communication link. For example, if a user is observed to consistently overshoot when scrolling a program guide display, the acceleration and/or deceleration coefficients can be automatically adjusted by the STB application to improve the user experience. In yet another embodiment, a control device can store multiple parameter sets corresponding to various activities, for example one set optimized for menu navigation and another set optimized for playing games, with the parameter set currently in use being selected by the STB based on the current activity being carried out. Alternatively, only a single set of basic parameters can be stored, with an activity-related scale factor provided by the STB. [0028] In some contemplated embodiments, not all parameters may necessarily be revealed to a user of the control device and made available for adjustment.
For example, for the sake of simplicity, certain parameter values can be set by the manufacturer or supplier of the control device, based, for example, on the application and/or target device with which the control device is intended to be used, and pre-configured at manufacturing or installation time.

[0029] While various concepts have been described in detail, it will be appreciated by those skilled in the art that various modifications and alternatives to those concepts could be developed in light of the overall teachings of this disclosure.

[0030] Additionally, while described in the context of functional modules and illustrated using block diagram format, it is to be understood that, unless otherwise stated, one or more of the described functions and/or features can be integrated in a single physical device and/or a software module, or one or more functions and/or features can be implemented in separate physical devices or software modules. It will also be appreciated that a detailed discussion of the actual implementation of each module is not necessary for an enabling understanding of the invention. Rather, the actual implementation of such modules would be well within the routine skill of an engineer, given the disclosure herein of the attributes, functionality, and interrelationship of the various functional modules in the system. Consequently, a person skilled in the art, applying ordinary skill, will be able to practice the invention set forth in the claims without undue experimentation. It will be further appreciated that the particular concepts disclosed are meant to be illustrative only and not to limit the scope of the invention, which is to be given the full breadth of the appended claims and any equivalents thereof.

[0031] All patents cited within this document are hereby incorporated by reference in their entirety.
Claims (15)

[0001] Method for using a portable electronic device (100), having an input device (106) for receiving a gesture-based input from a user, to control a navigation operation of an apparatus (102, 104), comprising: - receiving the gesture-based input via the input device (106) of the portable electronic device; and - causing the portable electronic device (100) to transmit a navigation step command (404) to thereby control the navigation operation of the apparatus; characterized by the fact that the navigation step command (404) transmitted by the portable electronic device comprises a direction command (406) which is transmitted from the portable electronic device (100), wherein the transmission of the direction command (406) is repeated a variable number of times at a variable rate, and the variable number of times and the variable rate are determined as a function of one or more characteristics associated with the gesture-based input and one or more parameters stored in the memory (202, 204, 206) of the portable electronic device (100).

[0002] Method according to claim 1, characterized by the fact that the input device (106) comprises a touch-sensitive surface.

[0003] Method according to claim 2, characterized by the fact that the touch-sensitive surface comprises a two-axis capacitive touch sensor.

[0004] Method according to claim 1, characterized by the fact that it comprises providing, via the portable electronic device, user feedback representative of the gesture-based input.

[0005] Method according to claim 1, characterized by the fact that it comprises setting the parameters on the portable electronic device using user input received at the portable electronic device.
[0006] Method according to claim 1, characterized by the fact that it comprises establishing the parameters on a device remote from the portable electronic device and downloading the parameters from the remote device to the portable electronic device for storage in the memory of the portable electronic device.

[0007] Method according to claim 6, characterized by the fact that the parameters are established on the remote device via use of the portable electronic device to control navigation operations of the remote device.

[0008] Method according to claim 1, characterized by the fact that the parameters are downloaded from the apparatus to the portable electronic device for storage in the memory of the portable electronic device.

[0009] Method according to claim 1, characterized by the fact that parameters stored in the memory of the portable electronic device are caused to be dynamically changed via communications received from the apparatus.

[0010] Method according to claim 1, characterized by the fact that it comprises causing the portable electronic device to transmit to the apparatus a discrete command in response to the gesture-based input including a tapping gesture sensed at the input device.

[0011] Method according to claim 10, characterized by the fact that parameters stored in the memory of the portable electronic device further function to define one or more regions of the input device for receiving the tapping gesture.

[0012] Method according to claim 1, characterized by the fact that parameters stored in the memory of the portable electronic device comprise at least one of a threshold stroke time below which acceleration is not triggered, a deceleration coefficient, and a deceleration measurement window.
[0013] Method according to claim 1, characterized by the fact that the characteristics associated with the gesture-based input comprise at least one of an acceleration characteristic associated with the gesture-based input, a duration characteristic associated with the gesture-based input, and a location characteristic associated with the gesture-based input.

[0014] Method according to claim 1, characterized by the fact that the portable electronic device includes a parameter library and input is received at the portable electronic device to select parameters from the parameter library for use in the step of causing the portable electronic device to transmit a navigation step command to thereby control the navigation operation of the apparatus.

[0015] Method according to claim 14, characterized by the fact that the input received at the portable electronic device for selecting parameters from the parameter library comprises at least one of input received via a biometric input sensor of the portable electronic device and a transmission received from the apparatus.
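Claim 1's core mechanism — a direction command repeated a variable number of times at a variable rate, both determined from gesture characteristics and stored parameters — can be sketched as below. This is an illustrative interpretation only: the specific mapping, thresholds, and default values are assumptions, not the claimed formula.

```python
# Hypothetical sketch of the claimed computation: gesture characteristics
# (stroke time, stroke length) plus stored parameters yield the repeat count
# and repeat rate of the transmitted direction command. All constants below
# are assumptions for illustration.

def navigation_step_commands(stroke_time_ms, stroke_length,
                             threshold_stroke_time_ms=150,
                             acceleration_coefficient=2,
                             base_repeats=1, base_rate_hz=5.0):
    """Return (repeat_count, rate_hz) for one direction command.

    Longer strokes produce proportionally more repeats; strokes faster
    than the threshold stroke time trigger acceleration, repeating the
    direction command more times at a higher rate.
    """
    repeats = base_repeats * max(1, stroke_length)
    rate = base_rate_hz
    if stroke_time_ms < threshold_stroke_time_ms:  # quick flick -> accelerate
        repeats *= acceleration_coefficient
        rate *= acceleration_coefficient
    return repeats, rate
```

A slow three-unit drag would then yield three direction commands at the base rate, while a quick three-unit flick would yield six commands at twice the rate, which is one plausible reading of "a variable number of times at a variable rate."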
Similar technologies:
Publication number | Publication date | Patent title
BR112012004844B1 | 2021-02-17 | Method for using a portable electronic device
JP5669393B2 | 2015-02-12 | Programmable multimedia controller remote control unit
JP6083072B2 | 2017-02-22 | Smart air mouse
EP3367186B1 | 2020-10-07 | System and method for interactive appliance control
BR112012028953B1 | 2020-09-24 | System for enhanced remote control functionality
CN102473040B | 2015-05-27 | Multi-dimensional controlling device
CN102460367A | 2012-05-16 | Directional touch remote
BR112012015405B1 | 2021-03-02 | Control device and method for using a control device
CN104054331A | 2014-09-17 | Features for use with multi-sided controlling device
CN107172471A | 2017-09-15 | Device for running touch-screen applications on a display device, and display device
CN112740146A | 2021-04-30 | Gesture recognition system
Patent family:
Publication number | Publication date
US9477402B2 | 2016-10-25
EP2473991A1 | 2012-07-11
US10089008B2 | 2018-10-02
WO2011028692A1 | 2011-03-10
EP2473991B1 | 2016-04-27
US20130241715A1 | 2013-09-19
US9335923B2 | 2016-05-10
US20180129412A1 | 2018-05-10
US9927972B2 | 2018-03-27
US10031664B2 | 2018-07-24
US20130246978A1 | 2013-09-19
US20130241825A1 | 2013-09-19
US20110055772A1 | 2011-03-03
US20160110039A1 | 2016-04-21
US20130254722A1 | 2013-09-26
US9086739B2 | 2015-07-21
US9323453B2 | 2016-04-26
EP3062307A1 | 2016-08-31
US20130241876A1 | 2013-09-19
US20130246979A1 | 2013-09-19
US9261976B2 | 2016-02-16
US8438503B2 | 2013-05-07
EP2473991A4 | 2012-11-07
CN102598110A | 2012-07-18
US20150346999A1 | 2015-12-03
EP3062307B1 | 2017-01-04
BR112012004844A2 | 2020-10-06
US9250715B2 | 2016-02-02
US9134815B2 | 2015-09-15
US20130254721A1 | 2013-09-26
CN102598110B | 2015-04-22
Cited references:
公开号 | 申请日 | 公开日 | 申请人 | 专利标题 US3699522A|1966-09-20|1972-10-17|Gen Signal Corp|Locomotive radio control system with address and command signals| DE1763685A1|1968-07-18|1971-11-25|Messerschmitt Boelkow Blohm|Procedure for transferring commands| CH534446A|1971-07-26|1973-02-28|Landis & Gyr Ag|Ripple control receiver| US4090201A|1976-09-08|1978-05-16|Harris Corporation|Rate augmented step track system| US4623887A|1984-05-15|1986-11-18|General Electric Company|Reconfigurable remote control| US4774511A|1985-05-30|1988-09-27|Nap Consumer Electronics Corp.|Universal remote control unit| US4703359A|1985-05-30|1987-10-27|Nap Consumer Electronics Corp.|Universal remote control unit with model identification capability| US5872562A|1985-05-30|1999-02-16|U.S. Philips Corporation|Universal remote control transmitter with simplified device identification| US4855730A|1987-05-08|1989-08-08|Rca Licensing Corporation|Component audio/video system with timed control of plural peripheral devices| US5537463A|1987-10-14|1996-07-16|Universal Electronics Inc.|Magnetic modem in a remote control| US6014092A|1987-10-14|2000-01-11|Universal Electronics Inc.|Key mover| US4959810A|1987-10-14|1990-09-25|Universal Electronics, Inc.|Universal remote control device| US5481256A|1987-10-14|1996-01-02|Universal Electronics Inc.|Direct entry remote control with channel scan| US5606443A|1993-07-28|1997-02-25|Sony Corporation|Control circuit for entertainment system demonstration| US5481258A|1993-08-11|1996-01-02|Glenayre Electronics, Inc.|Method and apparatus for coordinating clocks in a simulcast network| US5570113A|1994-06-29|1996-10-29|International Business Machines Corporation|Computer based pen system and method for automatically cancelling unwanted gestures and preventing anomalous signals as inputs to such system| KR0170326B1|1994-07-27|1999-03-30|김광호|Remote control method and apparatus| US5592604A|1994-08-31|1997-01-07|International Business Machines Corporation|Method and system for indicating 
boundaries of connected data subsets| US5460160A|1994-09-29|1995-10-24|Parrott; William T.|Portable golf ball warming device| US5777614A|1994-10-14|1998-07-07|Hitachi, Ltd.|Editing support system including an interactive interface| JP3153084B2|1994-11-15|2001-04-03|エスエムケイ株式会社|Pulse modulation method| JPH08286807A|1995-04-18|1996-11-01|Canon Inc|Data processing unit and method for recognizing gesture| US5614906A|1996-04-23|1997-03-25|Universal Electronics Inc.|Method for selecting a remote control command set| US6127941A|1998-02-03|2000-10-03|Sony Corporation|Remote control device with a graphical user interface| US6151208A|1998-06-24|2000-11-21|Digital Equipment Corporation|Wearable computing device mounted on superior dorsal aspect of a hand| US7586398B2|1998-07-23|2009-09-08|Universal Electronics, Inc.|System and method for setting up a universal remote control| US6157319A|1998-07-23|2000-12-05|Universal Electronics Inc.|Universal remote control system with device activated setup| US6437836B1|1998-09-21|2002-08-20|Navispace, Inc.|Extended functionally remote control system and method therefore| US6678891B1|1998-11-19|2004-01-13|Prasara Technologies, Inc.|Navigational user interface for interactive television| US6225938B1|1999-01-14|2001-05-01|Universal Electronics Inc.|Universal remote control system with bar code setup| US20060061551A1|1999-02-12|2006-03-23|Vega Vista, Inc.|Motion detection and tracking system to control navigation and display of portable displays including on-chip gesture detection| US6256019B1|1999-03-30|2001-07-03|Eremote, Inc.|Methods of using a controller for controlling multi-user access to the functionality of consumer devices| US6639584B1|1999-07-06|2003-10-28|Chuang Li|Methods and apparatus for controlling a portable electronic device using a touchpad| US6396523B1|1999-07-29|2002-05-28|Interlink Electronics, Inc.|Home entertainment device remote control| US6473135B1|2000-02-16|2002-10-29|Sony Corporation|Signal input selector for 
television set and method of implementing same| US6867764B2|2000-03-22|2005-03-15|Sony Corporation|Data entry user interface| US6765557B1|2000-04-10|2004-07-20|Interlink Electronics, Inc.|Remote control having touch pad to screen mapping| SE0002472L|2000-06-30|2001-12-31|Nokia Corp|Method and apparatus for selection control| US6629077B1|2000-11-22|2003-09-30|Universal Electronics Inc.|Universal remote control adapted to receive voice input| DE60117676T2|2000-12-29|2006-11-16|Stmicroelectronics S.R.L., Agrate Brianza|A method for easily extending the functionality of a portable electronic device and associated portable electronic device| JP2004525675A|2001-01-24|2004-08-26|インターリンクエレクトロニクスインコーポレイテッド|Game and home entertainment device remote control| US20050134578A1|2001-07-13|2005-06-23|Universal Electronics Inc.|System and methods for interacting with a control environment| US6914551B2|2002-04-12|2005-07-05|Apple Computer, Inc.|Apparatus and method to facilitate universal remote control| JP4109902B2|2002-05-27|2008-07-02|キヤノン株式会社|Display device| US7167913B2|2002-06-05|2007-01-23|Universal Electronics Inc.|System and method for managing communication links| JP3891143B2|2002-06-14|2007-03-14|ヤマハ株式会社|Status setting device and program| US7646372B2|2003-09-15|2010-01-12|Sony Computer Entertainment Inc.|Methods and systems for enabling direction detection when interfacing with a computer program| US7362313B2|2003-01-17|2008-04-22|3M Innovative Properties Company|Touch simulation system and method| KR100514191B1|2003-01-23|2005-09-13|삼성전자주식회사|remote controller and set-top-box for it| WO2005015943A1|2003-08-07|2005-02-17|Samsung Electronics Co., Ltd.|A/v system available for integrated control and method of controlling the same| US20050068307A1|2003-09-30|2005-03-31|Microsoft Corporation|System, method and apparatus for a media computing device remote control| US8042049B2|2003-11-03|2011-10-18|Openpeak Inc.|User interface for multi-device control| 
US7155305B2|2003-11-04|2006-12-26|Universal Electronics Inc.|System and methods for home appliance identification and control in a networked environment| JP4463767B2|2003-12-26|2010-05-19|パナソニック株式会社|Control signal receiver| CN1333539C|2003-12-29|2007-08-22|明基电通股份有限公司|Method for remote control of electronic apparatus| US7421656B2|2004-01-05|2008-09-02|Microsoft Corporation|Systems and methods for interacting with a user interface of a media player| US7706616B2|2004-02-27|2010-04-27|International Business Machines Corporation|System and method for recognizing word patterns in a very large vocabulary based on a virtual keyboard layout| US7173604B2|2004-03-23|2007-02-06|Fujitsu Limited|Gesture identification of controlled devices| US8320708B2|2004-04-02|2012-11-27|K-Nfb Reading Technology, Inc.|Tilt adjustment for optical character recognition in portable reading machine| JP5053078B2|2004-04-30|2012-10-17|ヒルクレスト・ラボラトリーズ・インコーポレイテッド|Handheld pointing device and method of operating the same| WO2005109847A2|2004-04-30|2005-11-17|Hillcrest Laboratories, Inc.|Methods and devices for identifying users based on tremor| US8479122B2|2004-07-30|2013-07-02|Apple Inc.|Gestures for touch sensitive input devices| US7295904B2|2004-08-31|2007-11-13|International Business Machines Corporation|Touch gesture based interface for motor vehicle| US20060095596A1|2004-11-03|2006-05-04|Yung Lin C|Solution for consumer electronics control| US7631278B2|2004-11-19|2009-12-08|Microsoft Corporation|System and method for directional focus navigation| CN101133385B|2005-03-04|2014-05-07|苹果公司|Hand held electronic device, hand held device and operation method thereof| US20060227117A1|2005-04-07|2006-10-12|Microsoft Corporation|Circular touch sensor| US7577925B2|2005-04-08|2009-08-18|Microsoft Corporation|Processing for distinguishing pen gestures and dynamic self-calibration of pen-based computing systems| DE102006018238A1|2005-04-20|2007-03-29|Logitech Europe S.A.|Remote control system for home theater 
system, analyzes log of events stored by remote controller to identify patterns of interest in logged use of remote controller| US7487461B2|2005-05-04|2009-02-03|International Business Machines Corporation|System and method for issuing commands based on pen motions on a graphical keyboard| JP4427486B2|2005-05-16|2010-03-10|株式会社東芝|Equipment operation device| US7301464B2|2005-05-24|2007-11-27|Electronic Data Systems Corporation|Process and method for safer vehicle navigation through facial gesture recognition and operator condition monitoring| US20100141578A1|2005-07-29|2010-06-10|Pioneer Corporation|Image display control apparatus, image display apparatus, remote controller, and image display system| US7907222B2|2005-09-08|2011-03-15|Universal Electronics Inc.|System and method for simplified setup of a universal remote control| JP2007081685A|2005-09-13|2007-03-29|Sony Corp|Image signal processor, image signal processing method, and image signal processing system| US9201507B2|2005-11-15|2015-12-01|Carefusion 303, Inc.|System and method for rapid input of data| US20070136778A1|2005-12-09|2007-06-14|Ari Birger|Controller and control method for media retrieval, routing and playback| TWI348639B|2005-12-16|2011-09-11|Ind Tech Res Inst|Motion recognition system and method for controlling electronic device| US20070177804A1|2006-01-30|2007-08-02|Apple Computer, Inc.|Multi-touch gesture dictionary| US8054294B2|2006-03-31|2011-11-08|Sony Corporation|Touch screen remote control system for use in controlling one or more devices| US7523439B2|2006-07-11|2009-04-21|Tokyo Electron Limited|Determining position accuracy of double exposure lithography using optical metrology| US7907117B2|2006-08-08|2011-03-15|Microsoft Corporation|Virtual controller for visual displays| US8564543B2|2006-09-11|2013-10-22|Apple Inc.|Media player with imaged based browsing| JP4979314B2|2006-09-13|2012-07-18|任天堂株式会社|GAME PROGRAM AND GAME DEVICE| US20080104547A1|2006-10-25|2008-05-01|General Electric 
Company|Gesture-based communications| TWI324304B|2006-12-15|2010-05-01|Inventec Corp|Method for reading data of input/output port| US20080143681A1|2006-12-18|2008-06-19|Xiaoping Jiang|Circular slider with center button| KR100720335B1|2006-12-20|2007-05-23|최경순|Apparatus for inputting a text corresponding to relative coordinates values generated by movement of a touch position and method thereof| US8977255B2|2007-04-03|2015-03-10|Apple Inc.|Method and system for operating a multi-function portable electronic device using voice-activation| JP5453246B2|2007-05-04|2014-03-26|クアルコム,インコーポレイテッド|Camera-based user input for compact devices| EP2003556A1|2007-05-25|2008-12-17|Axalto SA|Method of processing by a portable electronical device of applicative commands originating from physical channels, corresponding device and system| US20090096606A1|2007-10-11|2009-04-16|Remote Logics, Inc.|Remote obstruction sensing device| US8350971B2|2007-10-23|2013-01-08|Sling Media, Inc.|Systems and methods for controlling media devices| US20090121903A1|2007-11-12|2009-05-14|Microsoft Corporation|User interface with physics engine for natural gestural control| US9503562B2|2008-03-19|2016-11-22|Universal Electronics Inc.|System and method for appliance control via a personal communication or entertainment device| US8264381B2|2008-08-22|2012-09-11|Microsoft Corporation|Continuous automatic key control| US20100171635A1|2009-01-02|2010-07-08|Ewig Industries Macao Commerical Offshore, Ltd.|System And Method For Motion-Sensitive Remote Control For Audio-Visual Entertainment System| US8547326B2|2009-02-24|2013-10-01|Blackberry Limited|Handheld electronic device having gesture-based control and a method of using same| US8742885B2|2009-05-01|2014-06-03|Apple Inc.|Directional touch remote| US8438503B2|2009-09-02|2013-05-07|Universal Electronics Inc.|System and method for enhanced command input| US9520056B2|2010-05-11|2016-12-13|Universal Electronics Inc.|System and methods for enhanced remote control 
functionality|US8438503B2|2009-09-02|2013-05-07|Universal Electronics Inc.|System and method for enhanced command input| US9520056B2|2010-05-11|2016-12-13|Universal Electronics Inc.|System and methods for enhanced remote control functionality| US20120095575A1|2010-10-14|2012-04-19|Cedes Safety & Automation Ag|Time of flighthuman machine interface | US20120124500A1|2010-11-16|2012-05-17|Motorola Mobility, Inc.|Use of discrete input to control controllable device| US20120242514A1|2011-03-24|2012-09-27|Smile Technology Co., Ltd.|Hybrid keyboard| TW201322294A|2011-11-29|2013-06-01|Darfon Electronics Corp|Keyboard| KR101237472B1|2011-12-30|2013-02-28|삼성전자주식회사|Electronic apparatus and method for controlling electronic apparatus thereof| US9497509B2|2012-11-29|2016-11-15|Echostar Uk Holdings Limited|Navigation techniques for electronic programming guides and video| JP5821895B2|2013-05-22|2015-11-24|トヨタ自動車株式会社|Map display controller| EP2827332B1|2013-07-19|2020-09-09|Nxp B.V.|Navigating within a media item| US20150160779A1|2013-12-09|2015-06-11|Microsoft Corporation|Controlling interactions based on touch screen contact area| JP1525112S|2014-11-14|2015-06-01| USD777753S1|2014-11-14|2017-01-31|Espec Corp.|Display screen with graphical user interface| DE102014117544A1|2014-11-28|2016-06-02|Claas Saulgau Gmbh|Human machine interface for an agricultural implement, control system and agricultural implement| CN104867311A|2015-05-14|2015-08-26|连宁|Method and apparatus for configuring wireless remote control terminal through third-party terminal| US9977887B2|2015-09-17|2018-05-22|Sony Mobile Communications Inc.|Electronic device and method for validation of a trusted user| GB2552274A|2015-11-09|2018-01-17|Sky Cp Ltd|Television user interface| US20180275837A1|2017-03-23|2018-09-27|RideOn Ltd.|Graphical user interfacecontrols| US20180316963A1|2017-04-28|2018-11-01|Samsung Electronics Co., Ltd.|Display apparatus and method of operating the same| 
CN110636357A|2019-10-15|2019-12-31|东莞市安瑞创智能科技有限公司|Fingerprint quick identification unblock domestic remote controller|
Legal status:
2020-10-13 | B06U | Preliminary requirement: requests with searches performed by other patent offices: suspension of the patent application procedure
2020-12-08 | B09A | Decision: intention to grant
2021-02-17 | B16A | Patent or certificate of addition of invention granted | Free format text: TERM OF VALIDITY: 10 (TEN) YEARS COUNTED FROM 17/02/2021, SUBJECT TO THE LEGAL CONDITIONS.
Priority:
Application number | Filing date | Patent title
US12/552,761 | US8438503B2 | 2009-09-02 | 2009-09-02 | System and method for enhanced command input
US12/552,761 | 2009-09-02
US12/552761 | 2009-09-02
PCT/US2010/047268 | WO2011028692A1 | 2009-09-02 | 2010-08-31 | System and method for enhanced command input